A Post by Michael B. Spring


So What is Information (September 25, 2008)

I have been a faculty member in the School of Information Sciences -- formerly the Graduate School of Library and Information Science -- for more than two decades now. When I arrived, the name of the department was the Interdisciplinary Department of Information Science. In the early 1990s it was changed to the Department of Information Science and Telecommunications. Recently, as part of a school-wide restructuring, it became the Graduate Information Science and Technology Program. The constant in all the changes and evolution would seem to be Information Science -- or the Science of Information. I will address the "science" of information in a future piece focusing on King and Brownell's work on communities of discourse and Simon's thoughts on sciences of the artificial. For now, I would like to focus on information.

Information is in the “I” of the beholder

Let's begin simply. My birth date is May 27th. That is probably information for you. It would not be information for my mother, were she still alive. You may not value the information, but I suspect you would agree that it was something you did not know. Somewhere, the "fact" that May 27th is my birthday is recorded, but that is not information in and of itself -- we refer to it as a record. It might or might not be information to you. Thus we have some first conversational concepts that we can use to define information. Let's begin with the fact that the measure of information is relative to the recipient. What is information to one human is not information to another. Is it possible to "inform" a building, or a computer, or an automobile? I am not quite sure how to answer that. On the one hand, I am prepared to state that what we refer to as information is tightly bound to the human experience. Some of my colleagues would argue that ant scent trails constitute information, and that computers produce information displays -- sometimes regardless of whether they are used by humans. I am prepared to engage these arguments and be convinced that information is a concept that has a scope beyond the human experience, but for now, I will ask you to accept a temporary stipulation that sans humans there is no information. The reason for asking for this stipulated limit is a desire to be able to easily build a more complete definition. If we expand the scope too quickly to all these arguable extensions, some of the points I wish to make will be much more difficult. So for this argument, information is a function of the human experience. If a tree falls in the woods and there is no human, it may make a sound, but we do not have any information about the fact that it made a sound, and we do not know that it fell, nor why it fell.

We may now take another step. We are stipulating that the receiver is restricted to entities we call humans. Further, I propose that the measure of information is dependent upon the receiver. We need to somehow qualify what it is that makes something information to one human but not another. We know informally what it is; let's see if we can make it a little more formal. If I already have some information, receipt of the same fact is not information. Messages contain information when the message is about something not already known to the receiver. At the risk of moving too fast, what I know, the knowledge I have, acts as the mediator of whether a message contains information.

This leaves us with several concepts that are of use in furthering our inquiry. The first is the notion of a store of information, which we will call knowledge. The second is the notion of messages which are delivered to us. The third is the notion of the contents of the message measured along a dimension we call information. A message may contain no information, a little information, or a lot of information. We may have a lot of knowledge, a little knowledge, or no knowledge. What we know may be partitioned into domains. I know a lot about Pittsburgh, a little about Bangkok and New York, and next to nothing about Nairobi. We could go on for a while here, but let me simply add one more caveat at this level. Information may be true or false. I may be misinformed. I may receive a lot of bad information. Pretty neat. Ok, there is a lot more at this level, like the value of information, but we will leave those discussions for now and turn to messages, and then to the disembodiment of information.
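A crude way to see the relativity of information to the receiver is to model knowledge, very loosely, as a store of facts. The sketch below (in Python, with hypothetical names of my own invention, and no claim that knowledge really reduces to a set of strings) simply shows that the same message is information to one receiver and not to another:

    # A toy illustration only: knowledge as a set of facts, and the "information"
    # in a message as whatever facts the receiver does not already hold.
    def information_in(message_facts, receiver_knowledge):
        """Return the facts in the message that are new to this receiver."""
        return set(message_facts) - set(receiver_knowledge)

    you = {"Pittsburgh is in Pennsylvania"}
    mother = {"the birthday is May 27", "Pittsburgh is in Pennsylvania"}
    message = {"the birthday is May 27"}

    print(information_in(message, you))     # the birthday fact -- information for you
    print(information_in(message, mother))  # empty set -- not information for her

The point of the sketch is only the subtraction: the same message, measured against two different knowledge stores, yields two different amounts of information.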

Messages

We suggested above that "messages may carry information". What is a message? Again, I am going to suggest that a message is something received by a human. Unlike information, it would be hard to argue that only humans process messages. Here I would have to agree that computers process messages and that the scent trails laid down by ants constitute messages to other ants. So I fully agree that a message is a very general concept and surely not restricted to the human experience. At the same time, I will stipulate that for now, I am only speaking about messages that are received by humans. I would suggest that there are two broad classes of messages that are processed by humans. The first is messages that consist of raw data processed from the environment. The sound of screeching brakes, the sound of water running, the smell of smoke, a sunset, the warmth of the summer sun. All of these experiences, whether direct or indirect (e.g. a picture of a sunset, or a recording of a gunshot), may contain information. We don't always define these experiences as "messages" and that is fair, but I would argue that these sensations or experiences should be defined as low grade messages. There is much to be discussed here about how these patterns of signals move from signal to data (pattern) to information -- it is raining outside. It has a lot to do with the knowledge we bring to bear on the signals. Those signals that make no sense -- that form no pattern -- are commonly referred to as noise. One of the goals of natural science is to turn noise into information. So at this level of messaging, we begin to introduce the concept of noise as meaningless (informationless) patterns. We can come back to this concept and mine it further, but for this discussion, I want to turn from low grade messages to high grade messages.

If we agree that we can grudgingly refer to sensory experiences as messages that can carry information, what is it that we really want to think of as a message? I think it is pretty easy to agree that a message involves a sender and a receiver and that it is pretty easy to imagine one human sending a message to another human. Again, I acknowledge that "message" is a very general concept, but here I am talking about exchanges between humans. An email message constitutes a first class message. It might be considered a "prototype" for messages, in the same way that psychologists suggest that a robin is the "prototype" for bird. This does not mean that a penguin or ostrich is not a bird; they are just not as prototypical as a robin. Let's expand our discussion about human to human messages. If my sister kicks me under the table during a conversation over Thanksgiving dinner, she sends me a message -- probably about the foot I am about to stick in my mouth. So, a hug, a kiss, a punch, can all be messages exchanged between humans. And for millions of years, that was how humans exchanged messages. With the development of spoken language about forty thousand years ago, our ability to exchange messages greatly increased. Four to five thousand years ago, with the advent of written language, our ability to exchange messages increased again. Spoken language allowed for same-time-same-place messaging. The technology of written language allowed for messaging across time and space. We did not have to be at Gettysburg on the afternoon of Thursday, November 19, 1863 to get the message from the president of the United States -- "that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion." So, information may be contained in messages which vary from the "low grade" messages that contain raw sensory data that must be interpreted to the "high grade" messages that consist of sets of symbols that are constructed explicitly to facilitate the communication of information between humans. Indeed, I would suggest that we could profitably restrict the study of information to these high grade messages consisting of symbols. Once we understand information in this form, we can extend our exploration to all the other forms -- including the scent trails of ants. In this discussion, I turn to one final topic in my trail -- and that is not the scent trail, but coding and the beginning of formalizing a definition for information.

Coding information

For several thousand years, we have been using language to exchange information in encoded form. To a large extent this is what I was referring to when I talked about the disembodiment of information. This has been a great boon to the advancement of civilization. We have not only encoded the individual pieces -- I was tempted to say bits, but that would be premature -- but have also organized these pieces and begun the process of assessing the validity of the information. This process is deserving of study in its own right. We can say that there was a lot of information in a given book. We can identify new information. We can label information public or private. We can store, transmit, and access ever greater amounts of information in various forms of messages. Can we measure information? I don't think we yet have adequate ways to do this, but there have been some interesting developments.

One of the most interesting came from Claude Shannon and Warren Weaver. It has been overplayed in some circles, but it is interesting for what it is. At the core, they hypothesized that one measure of the information in a message could be based on the probability of that message. If the probability of a given message is unity, there is no information. If the probability of a message is 50%, there is some information in the message. If the probability of a message is 25%, there is more information in the message, and so on. Working for Bell Labs, Shannon was interested in how much bandwidth was needed to communicate a message, or how much space was needed to store a message. With the advent of digital computers using binary units to store a message, it became useful to think about how many binary digits would be required to store a message. Shannon suggested (a simplistic explanation) that one measure of information is I = log2(1/p), where p is the probability of the message. We can thus say that if information is to be stored as an array of binary digits and the probability of the message is .5, we would need log2(1/.5) bits to store the message. The base 2 log of 1/.5 (i.e. 2) is 1. When the probability of a message is 50/50, I can record the message as a 0 or 1 in one binary digit. If the probability of a given message is .00390625, I would need log2(1/.00390625), or 8 bits. In this case, my magic number of .0039 is 1/256. What Shannon is saying is that if I wish to be able to have a message that can be any one of 256 equally likely symbols, I would need 8 bits to represent it. A rich analysis of information is possible based on a measure of information as the probability of a message. It opens the door to the computation, transmission, encryption, compression, and correction of messages stored in digital form. It provides a simple yet rich mathematical theory that allows us to do all sorts of things, and it is completely consistent with the discussion of information being put forward here. While Shannon and Weaver's definition is direct, elegant, and powerful, I believe it lacks some of the richer notions I have tried to put forward here. It is not that their definition is wrong. Rather, we require complementary definitions that allow us to examine other aspects of the phenomenon.
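For readers who like to see the arithmetic, here is a minimal sketch of the calculation above (in Python; the function name is mine, not Shannon's, and it assumes the simple equiprobable cases used in the examples):

    import math

    def bits_of_information(probability):
        """Shannon's measure I = log2(1/p): bits needed for a message of probability p."""
        return math.log2(1.0 / probability)

    print(bits_of_information(0.5))      # 1.0 bit  -- a 50/50 message
    print(bits_of_information(0.25))     # 2.0 bits -- a message with probability 1/4
    print(bits_of_information(1 / 256))  # 8.0 bits -- one of 256 equally likely symbols

Nothing more is going on here than taking the base 2 logarithm of one over the probability; the less likely the message, the more bits it takes to carry it.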

Conclusion

I have attempted here to argue for a definition of information that allows us to meaningfully partition the space and study the phenomenon. The core of my argument is based on information as a part of the human experience. Without humans, we can't talk about information. With humans, we can talk about information as a measure of the degree to which a message transforms the state of awareness, the knowledge structure of the receiver. We can partition messages into at least two groups -- those received via direct observation of natural phenomena and those received via some form of symbolic communication from another human. I would argue that while a comprehensive study of information is desirable, it may be more productive to begin with the analysis of symbolic messages between humans. Based on models developed in this simplified context, the theory and concepts of information might later be extended more broadly.

So this is how I would begin the definition of information.